Fire Detection: Predicting Wildfire Behavior Using Aerial Imagery


Background

Wildfires are dangerous and often unpredictable fires that burn through natural land. Because they can be instigated by a myriad of environmental and human factors, ranging from lightning strikes and climate change to unattended campfires, they have become increasingly common. For instance, the Congressional Research Service reported that 2022 alone witnessed approximately 64,000 distinct wildfires in the U.S., which collectively burned almost 7.5 million acres across the country. Although some wildfires will inevitably occur due to unpreventable causes, applying machine learning to aerial imagery could enable us to predict fire behavior. We hope that our model's results can be used to prevent further fire damage, protect ecosystems and communities, and improve fire management.

Source: https://sgp.fas.org/crs/misc/IF10244.pdf

ML Question

How can we build a CNN model that predicts from an image whether a fire is occurring in an area?

Dataset Description

This image classification dataset comes from the IEEE DataPort database. The images were extracted from video of a prescribed burn in Northern Arizona, captured via aerial imagery using drone cameras. The almost 40,000 raw images fall into one of two classes: Fire or No Fire.

https://ieee-dataport.org/open-access/flame-dataset-aerial-imagery-pile-burn-detection-using-drones-uavs

Importing Libraries


In [1]:
# Resource: https://www.tensorflow.org/tutorials/load_data/images

import tensorflow as tf
import pathlib
import os
import numpy as np 
import seaborn as sns
from tensorflow import keras
from sklearn.metrics import confusion_matrix
from google.colab import drive
import plotly
import plotly.graph_objs as go
import plotly.io as pio
pio.renderers.default='notebook'

from numpy.random import seed
seed(1)
tf.random.set_seed(2)
In [2]:
drive.mount('/content/drive')
Mounted at /content/drive

Opening Data


In [ ]:
!unzip "/content/drive/MyDrive/data/Training.zip"
In [ ]:
!unzip "/content/drive/MyDrive/data/Test.zip"

Adjusting Images


In [5]:
def change_saturation(img, level):
    # Resize to 150x150 and scale color saturation (level > 1 boosts it)
    img = img.resize((150, 150), PIL.Image.LANCZOS)
    converter = PIL.ImageEnhance.Color(img)
    img = converter.enhance(level)
    return img
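The `enhance(level)` call interpolates between a grayscale version of the image (level 0) and the original colors (level 1); values above 1 exaggerate saturation, which is why level 3 is used above to make flames stand out. A quick standalone sanity check of this behavior (using a synthetic solid-color image, not the dataset):

```python
from PIL import Image, ImageEnhance

# A solid red-ish image; at level 0 the enhancer fully desaturates it,
# so all three channels become equal
img = Image.new('RGB', (10, 10), (200, 50, 50))
gray = ImageEnhance.Color(img).enhance(0)
r, g, b = gray.getpixel((0, 0))
print(r == g == b)  # True: fully desaturated
```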
In [6]:
def rotate_image(img, angle):
  # expand=1 enlarges the canvas so rotated corners are not cropped
  return img.rotate(angle, PIL.Image.NEAREST, expand=1)
In [7]:
import PIL.Image
import PIL.ImageEnhance
import pathlib
import os

data_dir = pathlib.Path('Training')
Fire = list(data_dir.glob('Fire/*'))
No_Fire = list(data_dir.glob('No_Fire/*'))

for base_path in ['Training_Aug/', 'Test_Aug/']:
  for sub_path in ['Fire', 'No_Fire']:
    # exist_ok=True lets the cell be re-run without raising FileExistsError
    os.makedirs(os.path.join(base_path, sub_path), exist_ok=True)
In [8]:
for i in range(0, len(Fire), 6):  # sample every 6th Fire image
    im = change_saturation(PIL.Image.open(Fire[i]), 3)
    for j, angle in enumerate([180, 360]):  # 360 keeps the original orientation
      im_mod = rotate_image(im, angle)
      im_mod.save("Training_Aug/Fire/fire" + str(i+j) + ".jpg", subsampling=0, quality=100)
In [9]:
for i in range(0, len(No_Fire), 10):  # sample every 10th No_Fire image
    im = change_saturation(PIL.Image.open(No_Fire[i]), 3)
    for j, angle in enumerate([180, 360]):
      im_mod = rotate_image(im, angle)
      im_mod.save("Training_Aug/No_Fire/no_fire" + str(i+j) + ".jpg", subsampling=0, quality=100)
In [10]:
data_dir = pathlib.Path('Test')
Fire = list(data_dir.glob('Fire/*'))
No_Fire = list(data_dir.glob('No_Fire/*'))
In [11]:
for i in range(0, len(Fire)):
    im = change_saturation(PIL.Image.open(Fire[i]), 3)
    im.save("Test_Aug/Fire/fire" + str(i) + ".jpg", subsampling=0, quality=100)
In [12]:
for i in range(0, len(No_Fire)):
    im = change_saturation(PIL.Image.open(No_Fire[i]), 3)
    im.save("Test_Aug/No_Fire/no_fire" + str(i) + ".jpg", subsampling=0, quality=100)

Preprocessing Data


In [13]:
data_dir_train = pathlib.Path('Training_Aug')
data_dir_test = pathlib.Path('Test_Aug')
img_height = 150
img_width = 150
batch_size = 32
In [14]:
# Partitioning the dataset into training and test sets
# (a train/validation split of the training data is kept commented out below)

train_ds = tf.keras.utils.image_dataset_from_directory(
  data_dir_train,
  validation_split=0,
  subset=None,
  seed=123,
  image_size=(img_height, img_width),
  batch_size=batch_size)

'''
train_ds = tf.keras.utils.image_dataset_from_directory(
  data_dir_train,
  validation_split=0.2,
  subset="training",
  seed=123,
  image_size=(img_height, img_width),
  batch_size=batch_size)

val_ds = tf.keras.utils.image_dataset_from_directory(
  data_dir_train,
  validation_split=0.2,
  subset="validation",
  seed=123,
  image_size=(img_height, img_width),
  batch_size=batch_size)
'''

test_ds = tf.keras.utils.image_dataset_from_directory(
  data_dir_test,
  seed=123,
  image_size=(img_height, img_width),
  batch_size=batch_size)
Found 11212 files belonging to 2 classes.
Found 8617 files belonging to 2 classes.

Analyzing Data


In [15]:
class_names = train_ds.class_names
print(class_names)
['Fire', 'No_Fire']
In [16]:
len(train_ds)  # each element of train_ds is a batch of up to 32 images
Out[16]:
351
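The 351 batches follow directly from the 11,212 training files found above: with a batch size of 32, the final partial batch is kept, so the count is the ceiling of the division.

```python
import math

n_files = 11212   # training images found above
batch_size = 32

# The last, partial batch still counts as one batch
n_batches = math.ceil(n_files / batch_size)
print(n_batches)  # 351
```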
In [17]:
type(train_ds)
Out[17]:
tensorflow.python.data.ops.dataset_ops.BatchDataset
In [18]:
import matplotlib.pyplot as plt

plt.figure(figsize=(10, 10))
for images, labels in train_ds.take(1): # first batch
  # printing all of the images and labels in batch 1
  for i in range(32):
    ax = plt.subplot(4, 8, i + 1)
    plt.imshow(images[i].numpy().astype("uint8"))
    plt.title(class_names[labels[i]])
    plt.axis("off")

Model Creation


In [19]:
keras.backend.clear_session()

num_classes = 2

# Model 1

'''
model = keras.Sequential([
  keras.layers.InputLayer(input_shape=[254, 254, 3]),
  keras.layers.Rescaling(1./255),
  keras.layers.Conv2D(filters=32, kernel_size=3, activation='relu'),
  keras.layers.MaxPooling2D(),
  keras.layers.Conv2D(filters=32, kernel_size=3, activation='relu'),
  keras.layers.MaxPooling2D(),
  keras.layers.Conv2D(filters=32, kernel_size=3, activation='relu'),
  keras.layers.MaxPooling2D(),
  keras.layers.Flatten(),
  keras.layers.Dense(units=128, activation='relu'),
  keras.layers.Dense(num_classes, activation='softmax')
])

Epoch 1/5
1231/1231 [==============================] - 95s 70ms/step - loss: 0.2218 - accuracy: 0.9176 - val_loss: 0.8101 - val_accuracy: 0.5449
Epoch 2/5
1231/1231 [==============================] - 83s 67ms/step - loss: 0.0737 - accuracy: 0.9785 - val_loss: 1.0356 - val_accuracy: 0.5517
Epoch 3/5
1231/1231 [==============================] - 84s 68ms/step - loss: 0.0450 - accuracy: 0.9873 - val_loss: 1.2428 - val_accuracy: 0.5451
Epoch 4/5
1231/1231 [==============================] - 84s 68ms/step - loss: 0.0329 - accuracy: 0.9911 - val_loss: 1.5365 - val_accuracy: 0.4922
Epoch 5/5
1231/1231 [==============================] - 86s 70ms/step - loss: 0.0255 - accuracy: 0.9940 - val_loss: 1.7235 - val_accuracy: 0.4923
<keras.callbacks.History at 0x7f7c80653100>
'''

# Model 2 

'''
model = keras.models.Sequential([
    keras.layers.InputLayer(input_shape=[img_height, img_width, 3]),
    keras.layers.Conv2D(filters=50, kernel_size=12, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=5, strides=2),    
    keras.layers.Conv2D(filters=50, kernel_size=12, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=5, strides=2),  
    keras.layers.Conv2D(filters=50, kernel_size=12, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=5, strides=2),                    
    keras.layers.Flatten(),
    keras.layers.Dense(units=128, activation='relu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(units=64, activation='relu'),
    keras.layers.Dense(units=num_classes, activation='softmax'),
])

Image Dimensions: 180 x 180
Learning Rate: 0.000005
270/270 [==============================] - 13s 45ms/step - loss: 1.7371 - accuracy: 0.6134
[1.737075924873352, 0.6134385466575623]
'''

# Model 3

'''
model = keras.models.Sequential([
    keras.layers.InputLayer(input_shape=[img_height, img_width, 3]),
    keras.layers.Conv2D(filters=50, kernel_size=10, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=5, strides=2),    
    keras.layers.Conv2D(filters=75, kernel_size=10, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=3, strides=2),  
    keras.layers.Conv2D(filters=50, kernel_size=10, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=5, strides=2),    
    keras.layers.Conv2D(filters=50, kernel_size=10, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=5, strides=2),                 
    keras.layers.Flatten(),
    keras.layers.Dense(units=64, activation='relu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(units=32, activation='relu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(units=16, activation='relu'),
    keras.layers.Dense(units=num_classes, activation='softmax')
])

Image Dimensions: 200 x 200
Learning Rate: 0.000001
Epochs: 5
270/270 [==============================] - 16s 58ms/step - loss: 1.1011 - accuracy: 0.6343
[1.1011264324188232, 0.6343274712562561]

Epochs: 10
270/270 [==============================] - 16s 58ms/step - loss: 0.8591 - accuracy: 0.6525
[0.8591445088386536, 0.652547299861908]
'''

model = keras.models.Sequential([
    # keras.layers.Rescaling(1./255),
    keras.layers.InputLayer(input_shape=[img_height, img_width, 3]),
    keras.layers.Conv2D(filters=100, kernel_size=10, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=3, strides=2), 
    keras.layers.Conv2D(filters=75, kernel_size=8, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=3, strides=2),  
    keras.layers.Conv2D(filters=50, kernel_size=6, activation='relu', strides=1),         
    keras.layers.MaxPooling2D(pool_size=3, strides=2),               
    keras.layers.Flatten(),
    keras.layers.Dense(units=256, activation='relu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(units=128, activation='relu'),
    keras.layers.Dropout(0.5),
    keras.layers.Dense(units=64, activation='relu'),
    keras.layers.Dense(units=2, activation='softmax')
])
In [20]:
model.summary()
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 conv2d (Conv2D)             (None, 141, 141, 100)     30100     
                                                                 
 max_pooling2d (MaxPooling2D  (None, 70, 70, 100)      0         
 )                                                               
                                                                 
 conv2d_1 (Conv2D)           (None, 63, 63, 75)        480075    
                                                                 
 max_pooling2d_1 (MaxPooling  (None, 31, 31, 75)       0         
 2D)                                                             
                                                                 
 conv2d_2 (Conv2D)           (None, 26, 26, 50)        135050    
                                                                 
 max_pooling2d_2 (MaxPooling  (None, 12, 12, 50)       0         
 2D)                                                             
                                                                 
 flatten (Flatten)           (None, 7200)              0         
                                                                 
 dense (Dense)               (None, 256)               1843456   
                                                                 
 dropout (Dropout)           (None, 256)               0         
                                                                 
 dense_1 (Dense)             (None, 128)               32896     
                                                                 
 dropout_1 (Dropout)         (None, 128)               0         
                                                                 
 dense_2 (Dense)             (None, 64)                8256      
                                                                 
 dense_3 (Dense)             (None, 2)                 130       
                                                                 
=================================================================
Total params: 2,529,963
Trainable params: 2,529,963
Non-trainable params: 0
_________________________________________________________________
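The parameter counts in the summary can be verified by hand: a Conv2D layer has (kernel_height × kernel_width × input_channels) weights per filter, plus one bias per filter.

```python
def conv2d_params(kernel, in_channels, filters):
    # Weights for every filter plus one bias per filter
    return (kernel * kernel * in_channels) * filters + filters

print(conv2d_params(10, 3, 100))   # first conv layer: 30100
print(conv2d_params(8, 100, 75))   # second conv layer: 480075
print(conv2d_params(6, 75, 50))    # third conv layer: 135050
```

All three values match the `conv2d` rows in the summary above.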
In [21]:
myLoss = 'sparse_categorical_crossentropy'
myOptimizer = keras.optimizers.Adam(learning_rate=0.000001)  # Learning Rate is key!!
myMetrics=['accuracy']

model.compile(loss=myLoss, optimizer=myOptimizer, metrics=myMetrics)
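Sparse categorical cross-entropy is used because `image_dataset_from_directory` yields integer class labels (0 or 1) rather than one-hot vectors. A minimal NumPy sketch of what the loss computes (the function name `sparse_cce` is ours, for illustration):

```python
import numpy as np

def sparse_cce(probs, labels):
    # Mean negative log-probability assigned to the true class;
    # labels are integer class ids, as produced by image_dataset_from_directory
    return -np.mean(np.log(probs[np.arange(len(labels)), labels]))

probs = np.array([[0.9, 0.1],    # confident Fire prediction
                  [0.4, 0.6]])   # leaning No_Fire
labels = np.array([0, 1])        # true classes: Fire, No_Fire
print(round(sparse_cce(probs, labels), 4))  # 0.3081
```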
In [22]:
myEpochs = 15 # change to 10 maybe and increase image dimensions

history = model.fit(
  train_ds,
  validation_data=test_ds,
  epochs=myEpochs
)
Epoch 1/15
351/351 [==============================] - 57s 137ms/step - loss: 12.8684 - accuracy: 0.5568 - val_loss: 10.2218 - val_accuracy: 0.5961
Epoch 2/15
351/351 [==============================] - 47s 133ms/step - loss: 3.3499 - accuracy: 0.6825 - val_loss: 5.4918 - val_accuracy: 0.5961
Epoch 3/15
351/351 [==============================] - 48s 136ms/step - loss: 1.7795 - accuracy: 0.7307 - val_loss: 3.8046 - val_accuracy: 0.5961
Epoch 4/15
351/351 [==============================] - 47s 134ms/step - loss: 1.2258 - accuracy: 0.7529 - val_loss: 2.4934 - val_accuracy: 0.5965
Epoch 5/15
351/351 [==============================] - 47s 134ms/step - loss: 0.8975 - accuracy: 0.7819 - val_loss: 2.1069 - val_accuracy: 0.5984
Epoch 6/15
351/351 [==============================] - 47s 134ms/step - loss: 0.7511 - accuracy: 0.8038 - val_loss: 1.5258 - val_accuracy: 0.6161
Epoch 7/15
351/351 [==============================] - 47s 134ms/step - loss: 0.6282 - accuracy: 0.8200 - val_loss: 1.4054 - val_accuracy: 0.6160
Epoch 8/15
351/351 [==============================] - 47s 134ms/step - loss: 0.5589 - accuracy: 0.8272 - val_loss: 1.2675 - val_accuracy: 0.6262
Epoch 9/15
351/351 [==============================] - 47s 134ms/step - loss: 0.4924 - accuracy: 0.8387 - val_loss: 1.1587 - val_accuracy: 0.6306
Epoch 10/15
351/351 [==============================] - 47s 134ms/step - loss: 0.4692 - accuracy: 0.8473 - val_loss: 1.0535 - val_accuracy: 0.6426
Epoch 11/15
351/351 [==============================] - 47s 133ms/step - loss: 0.4204 - accuracy: 0.8555 - val_loss: 1.1035 - val_accuracy: 0.6395
Epoch 12/15
351/351 [==============================] - 47s 133ms/step - loss: 0.3768 - accuracy: 0.8718 - val_loss: 0.9322 - val_accuracy: 0.6619
Epoch 13/15
351/351 [==============================] - 47s 133ms/step - loss: 0.3397 - accuracy: 0.8771 - val_loss: 0.9311 - val_accuracy: 0.6690
Epoch 14/15
351/351 [==============================] - 47s 134ms/step - loss: 0.3347 - accuracy: 0.8873 - val_loss: 0.9910 - val_accuracy: 0.6710
Epoch 15/15
351/351 [==============================] - 47s 134ms/step - loss: 0.3221 - accuracy: 0.8877 - val_loss: 0.9187 - val_accuracy: 0.6648
In [23]:
results = model.evaluate(test_ds)
print(results)
270/270 [==============================] - 10s 37ms/step - loss: 0.9187 - accuracy: 0.6648
[0.9187424182891846, 0.6648485660552979]
In [24]:
plt.rcParams["figure.figsize"] = (20,10)

plt.subplot(1, 2, 1)

plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='upper left')

plt.subplot(1, 2, 2)

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='upper left')

plt.show()
In [25]:
# Accuracy Graph

accuracy_graph = go.Scatter(
                    x = np.linspace(1, myEpochs, myEpochs),
                    y = history.history['accuracy'],
                    mode = "lines+markers",
                    name = "train",
                    marker = dict(color = 'rgba(16, 112, 2, 0.8)')
                  )
val_accuracy_graph = go.Scatter(
                    x = np.linspace(1, myEpochs, myEpochs),
                    y = history.history['val_accuracy'],
                    mode = "lines+markers",
                    name = "validation (test)",
                    marker = dict(color = 'rgba(112, 27, 27, 0.8)')
                  )
data = [accuracy_graph, val_accuracy_graph]

layout = dict(title=dict(text='Model Accuracy', xanchor='center', yanchor='top', y=0.9, x=0.5, 
                          font=dict(size=25)),
              xaxis= dict(title='Epochs', ticklen=5, zeroline=False),
              yaxis= dict(title='Accuracy', ticklen=5, zeroline=False),
             )
fig = dict(data = data, layout = layout)
plotly.offline.iplot(fig)
In [26]:
# Loss Graph

loss_graph = go.Scatter(
                    x = np.linspace(1, myEpochs, myEpochs),
                    y = history.history['loss'],
                    mode = "lines+markers",
                    name = "train",
                    marker = dict(color = 'rgba(16, 112, 2, 0.8)')
                  )
val_loss_graph = go.Scatter(
                    x = np.linspace(1, myEpochs, myEpochs),
                    y = history.history['val_loss'],
                    mode = "lines+markers",
                    name = "validation (test)",
                    marker = dict(color = 'rgba(112, 27, 27, 0.8)')
                  )
data = [loss_graph, val_loss_graph]

layout = dict(title=dict(text='Model Loss', xanchor='center', yanchor='top', y=0.9, x=0.5, 
                          font=dict(size=25)),
              xaxis= dict(title='Epochs', ticklen=5, zeroline=False),
              yaxis= dict(title='Loss', ticklen=5, zeroline=False),
             )
fig = dict(data = data, layout = layout)
plotly.offline.iplot(fig)
In [27]:
y_actual, y_pred = np.array([]), np.array([])

for images, labels in test_ds:
    y_pred = np.append(y_pred, np.argmax(model.predict(images, batch_size=batch_size), axis=1))
    y_actual = np.append(y_actual, labels.numpy())
In [28]:
cm = confusion_matrix(y_actual, y_pred)
ax = plt.subplot()

sns.heatmap(cm, annot=True, fmt='g', ax=ax)


ax.set_xlabel('Predicted Labels')
ax.set_ylabel('True Labels')
ax.set_title('Confusion Matrix')
ax.xaxis.set_ticklabels(class_names, rotation=0)
ax.yaxis.set_ticklabels(class_names, rotation=0)

plt.show()
In [29]:
# We are trying to maximize recall (flipped labels so that Fire is the positive class and No_Fire is the negative class)

tn, fp, fn, tp = confusion_matrix(np.abs(y_actual-1), np.abs(y_pred-1)).ravel()

recall = tp/(tp + fn)
recall
Out[29]:
0.8575043799883201
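The manual `np.abs(... - 1)` flip above can be avoided by telling scikit-learn directly which label is the positive class. A minimal sketch with illustrative toy arrays, assuming the loader's alphabetical class order (Fire = 0, No_Fire = 1):

```python
import numpy as np
from sklearn.metrics import recall_score

# Toy labels for illustration only; 0 = Fire, 1 = No_Fire
# (the alphabetical class order used by the dataset loader).
y_true = np.array([0, 0, 0, 1, 1, 0])
y_hat  = np.array([0, 1, 0, 1, 1, 0])

# pos_label=0 treats Fire as the positive class directly,
# so no label flipping is needed.
fire_recall = recall_score(y_true, y_hat, pos_label=0)
print(fire_recall)  # 3 of 4 Fire frames recovered -> 0.75
```

On the notebook's own `y_actual`/`y_pred`, `recall_score(y_actual, y_pred, pos_label=0)` should reproduce the value above.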

Transfer Learning


In [30]:
# Transfer Learning
base_model = keras.applications.xception.Xception(
    weights='imagenet',
    input_shape=(img_height, img_width, 3),
    include_top=False)
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/xception/xception_weights_tf_dim_ordering_tf_kernels_notop.h5
83683744/83683744 [==============================] - 3s 0us/step
In [31]:
base_model.trainable = False

inputs = keras.Input(shape=(img_height, img_width, 3))
x = base_model(inputs, training=False)
x = keras.layers.GlobalAveragePooling2D()(x)
outputs = keras.layers.Dense(units=2, activation='softmax')(x)
model = keras.Model(inputs, outputs)
In [32]:
model.summary()
Model: "model"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 input_3 (InputLayer)        [(None, 150, 150, 3)]     0         
                                                                 
 xception (Functional)       (None, 5, 5, 2048)        20861480  
                                                                 
 global_average_pooling2d (G  (None, 2048)             0         
 lobalAveragePooling2D)                                          
                                                                 
 dense_4 (Dense)             (None, 2)                 4098      
                                                                 
=================================================================
Total params: 20,865,578
Trainable params: 4,098
Non-trainable params: 20,861,480
_________________________________________________________________
In [33]:
myLoss = 'sparse_categorical_crossentropy'
myOptimizer = keras.optimizers.Adam(learning_rate=0.0001)
myMetrics = ['accuracy']

model.compile(loss=myLoss, optimizer=myOptimizer, metrics=myMetrics)
In [34]:
myEpochs = 10

history = model.fit(
  train_ds,
  validation_data=test_ds,
  epochs=myEpochs
)
Epoch 1/10
351/351 [==============================] - 44s 117ms/step - loss: 1.3512 - accuracy: 0.8461 - val_loss: 2.4261 - val_accuracy: 0.6527
Epoch 2/10
351/351 [==============================] - 39s 110ms/step - loss: 0.2098 - accuracy: 0.9465 - val_loss: 2.1258 - val_accuracy: 0.6693
Epoch 3/10
351/351 [==============================] - 40s 114ms/step - loss: 0.1492 - accuracy: 0.9581 - val_loss: 2.1719 - val_accuracy: 0.6590
Epoch 4/10
351/351 [==============================] - 39s 110ms/step - loss: 0.1135 - accuracy: 0.9671 - val_loss: 2.1798 - val_accuracy: 0.6604
Epoch 5/10
351/351 [==============================] - 39s 110ms/step - loss: 0.0905 - accuracy: 0.9730 - val_loss: 2.1530 - val_accuracy: 0.6661
Epoch 6/10
351/351 [==============================] - 39s 110ms/step - loss: 0.0760 - accuracy: 0.9766 - val_loss: 2.1575 - val_accuracy: 0.6767
Epoch 7/10
351/351 [==============================] - 39s 110ms/step - loss: 0.0666 - accuracy: 0.9801 - val_loss: 2.1793 - val_accuracy: 0.6677
Epoch 8/10
351/351 [==============================] - 39s 110ms/step - loss: 0.0578 - accuracy: 0.9823 - val_loss: 2.1782 - val_accuracy: 0.6746
Epoch 9/10
351/351 [==============================] - 39s 110ms/step - loss: 0.0517 - accuracy: 0.9839 - val_loss: 2.1295 - val_accuracy: 0.6752
Epoch 10/10
351/351 [==============================] - 43s 121ms/step - loss: 0.0466 - accuracy: 0.9864 - val_loss: 2.1106 - val_accuracy: 0.6818
In [35]:
results = model.evaluate(test_ds)
print(results)
270/270 [==============================] - 17s 64ms/step - loss: 2.1106 - accuracy: 0.6818
[2.1105687618255615, 0.6817917823791504]
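The frozen-base run above plateaus around 0.68 validation accuracy. A common next step, not performed here, is to unfreeze the top of the base and fine-tune at a much lower learning rate. Below is a sketch of the freeze/unfreeze pattern on a tiny stand-in model (the real `base_model` is Xception; the layer split and learning rate are illustrative, not tuned values):

```python
from tensorflow import keras

# Tiny stand-in for the Xception base, just to show the pattern.
base = keras.Sequential([keras.layers.Dense(8, activation='relu') for _ in range(3)])

base.trainable = True
for layer in base.layers[:-1]:
    layer.trainable = False  # keep the earlier layers frozen

flags = [layer.trainable for layer in base.layers]
print(flags)  # [False, False, True]

# After unfreezing, recompile with a much smaller learning rate before
# continuing model.fit, e.g. keras.optimizers.Adam(learning_rate=1e-5),
# so the pretrained weights are only gently adjusted.
```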
In [36]:
plt.rcParams["figure.figsize"] = (20,10)

plt.subplot(1, 2, 1)

plt.plot(history.history['accuracy'])
plt.plot(history.history['val_accuracy'])
plt.title('model accuracy')
plt.ylabel('accuracy')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='upper left')

plt.subplot(1, 2, 2)

plt.plot(history.history['loss'])
plt.plot(history.history['val_loss'])
plt.title('model loss')
plt.ylabel('loss')
plt.xlabel('epoch')
plt.legend(['train', 'val'], loc='upper left')

plt.show()
In [37]:
accuracy_graph = go.Scatter(
                    x = np.linspace(1, myEpochs, myEpochs),
                    y = history.history['accuracy'],
                    mode = "lines+markers",
                    name = "train",
                    marker = dict(color = 'rgba(16, 112, 2, 0.8)')
                  )
val_accuracy_graph = go.Scatter(
                    x = np.linspace(1, myEpochs, myEpochs),
                    y = history.history['val_accuracy'],
                    mode = "lines+markers",
                    name = "validation (test)",
                    marker = dict(color = 'rgba(112, 27, 27, 0.8)')
                  )
data = [accuracy_graph, val_accuracy_graph]

layout = dict(title=dict(text='Model Accuracy', xanchor='center', yanchor='top', y=0.9, x=0.5, 
                          font=dict(size=25)),
              xaxis= dict(title='Epochs', ticklen=5, zeroline=False),
              yaxis= dict(title='Accuracy', ticklen=5, zeroline=False),
             )
fig = dict(data = data, layout = layout)
plotly.offline.iplot(fig)
In [38]:
# Loss Graph

loss_graph = go.Scatter(
                    x = np.linspace(1, myEpochs, myEpochs),
                    y = history.history['loss'],
                    mode = "lines+markers",
                    name = "train",
                    marker = dict(color = 'rgba(16, 112, 2, 0.8)')
                  )
val_loss_graph = go.Scatter(
                    x = np.linspace(1, myEpochs, myEpochs),
                    y = history.history['val_loss'],
                    mode = "lines+markers",
                    name = "validation (test)",
                    marker = dict(color = 'rgba(112, 27, 27, 0.8)')
                  )
data = [loss_graph, val_loss_graph]

layout = dict(title=dict(text='Model Loss', xanchor='center', yanchor='top', y=0.9, x=0.5, 
                          font=dict(size=25)),
              xaxis= dict(title='Epochs', ticklen=5, zeroline=False),
              yaxis= dict(title='Loss', ticklen=5, zeroline=False),
             )
fig = dict(data = data, layout = layout)
plotly.offline.iplot(fig)
In [39]:
y_actual, y_pred = np.array([]), np.array([])

for images, labels in test_ds:
    y_pred = np.append(y_pred, np.argmax(model.predict(images, batch_size=batch_size), axis=1))
    y_actual = np.append(y_actual, labels.numpy())
1/1 [==============================] - 1s 689ms/step
1/1 [==============================] - 1s 664ms/step
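The long run of `1/1 [====]` lines is just the per-call progress bar that `model.predict` prints; passing `verbose=0` silences it. A sketch on a tiny stand-in model (the parameter works the same on the notebook's model inside the loop above):

```python
import numpy as np
from tensorflow import keras

# Tiny stand-in model; verbose=0 suppresses the per-batch progress bar
# that otherwise prints "1/1 [=====...]" on every predict call.
tiny = keras.Sequential([keras.layers.Dense(2, activation='softmax')])
probs = tiny.predict(np.zeros((4, 3)), verbose=0)
preds = np.argmax(probs, axis=1)
print(preds.shape)  # (4,)
```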
In [40]:
cm = confusion_matrix(y_actual, y_pred)
ax = plt.subplot()

sns.heatmap(cm, annot=True, fmt='g', ax=ax)


ax.set_xlabel('Predicted Labels')
ax.set_ylabel('True Labels')
ax.set_title('Confusion Matrix')
ax.xaxis.set_ticklabels(class_names, rotation=0)
ax.yaxis.set_ticklabels(class_names, rotation=0)

plt.show()
In [41]:
# We are trying to maximize recall (flipped labels so that Fire is the positive class and No_Fire is the negative class)

tn, fp, fn, tp = confusion_matrix(np.abs(y_actual-1), np.abs(y_pred-1)).ravel()

recall = tp/(tp + fn)
recall
Out[41]:
0.8020245279345922

Conclusion


While our own CNN had an accuracy of 0.6648 and a recall of 0.857, the pre-trained network we used through transfer learning achieved a slightly higher accuracy of 0.6818 but a lower recall of 0.802. Although the primary goal during model creation and selection was to maximize accuracy so the model would do better than random guessing, we decided it was also worth maximizing recall (the true positive rate). Because our model's results could play a significant role in determining whether a location has or had fire, we concluded that misclassifying No_Fire images as Fire is preferable to misclassifying Fire images as No_Fire: focusing on recall increases the model's chances of flagging an image as Fire when an actual fire is present, since the real-world consequences of missing a fire could be disastrous. Nevertheless, given the nature of the data, building a model that maximized both performance metrics proved challenging. As previously mentioned, because each training image is a frame from a prescribed-burning video, the dataset contains many near-duplicate images, which made the model prone to overfitting from the start. Despite trying multiple image augmentation techniques, including modifying saturation levels, rescaling the images, and rotating them to diversify the dataset, our model did not perform as well as we initially expected. Even so, we hope that models like ours will help officials identify wildfires more quickly and efficiently so that further fire damage can be prevented.

Future Work


Limitations to Address

We faced a couple of limitations throughout this project. The photos lacked diversity: many were similar to one another or taken mere seconds apart, since all of them come from a single prescribed burn, which is relatively small compared to a wildfire. As a result, the model mainly learned the characteristics of that one fire and may not perform as well on images from other fires. In addition, the dataset is large, which made the models slow to train; we mitigated this with techniques such as reducing the dataset size and lowering the image resolution.
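The two runtime mitigations mentioned above, shrinking the dataset and lowering the image size, can be expressed directly in a tf.data pipeline. A sketch on stand-in tensors; the shapes and counts are illustrative, not the notebook's actual values:

```python
import tensorflow as tf

# Stand-in batch of 20 fake 32x32 RGB frames, for illustration only.
ds = tf.data.Dataset.from_tensor_slices(tf.zeros((20, 32, 32, 3)))

ds = ds.map(lambda img: tf.image.resize(img, (16, 16)))  # lower the image size
ds = ds.take(10)                                         # use a smaller dataset

n_images = sum(1 for _ in ds)
print(n_images)  # 10
```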

Further Analysis

One way to build on this project is to apply the method to different types of fires, for instance by combining other datasets of fire images. By training on a range of images from various fires, the model could learn to detect a broader variety of fires. We could also go beyond classifying Fire vs. No Fire and classify fire types, such as crown fires vs. surface fires vs. ground fires. It would also be interesting to investigate classifying other natural disasters beyond wildfires, such as flooding. There is great potential to use deep learning to detect fires sooner and handle them more efficiently.